Convergence rates for Bayesian estimation and testing in monotone regression

Authors

Abstract

Shape restrictions such as monotonicity on functions often arise naturally in statistical modeling. We consider a Bayesian approach to the estimation of a monotone regression function and to testing for monotonicity. We construct a prior distribution using piecewise constant functions. For estimation, a prior imposing monotonicity on the heights of these steps is sensible, but the resulting posterior is harder to analyze theoretically. We consider a "projection-posterior" approach, in which a conjugate normal prior is used and the monotonicity constraint is imposed on posterior samples by a projection map onto the space of monotone functions. We show that the resulting posterior contracts at the optimal rate n^{-1/3} under the L1-metric and at a nearly optimal rate under the empirical Lp-metrics for 0 < p ≤ 2. The projection-posterior approach is also computationally more convenient. For testing the hypothesis of monotonicity, we use the posterior probability of a shrinking neighborhood of the set of monotone functions. We show that the resulting test has a universal consistency property and obtain a separation rate which ensures that the power of the test approaches one.
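To make the projection-posterior idea concrete, the following minimal Python sketch draws step heights from a conjugate normal posterior on an equal-width partition and then projects each draw onto the monotone cone by isotonic regression (pool-adjacent-violators). The number of bins, the prior variance, and the known noise level are illustrative assumptions, not choices taken from the paper.

```python
# Illustrative sketch of a projection-posterior for monotone regression.
# Assumptions (not from the paper): known Gaussian noise level, an
# equal-width partition of [0, 1], and independent N(0, tau^2) priors on
# the step heights. The projection step uses isotonic regression
# (pool-adjacent-violators) via scikit-learn.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

# Simulated data from a monotone truth f(x) = x^2 with Gaussian noise.
n = 300
x = rng.uniform(0.0, 1.0, n)
y = x**2 + 0.1 * rng.standard_normal(n)

# Piecewise-constant prior on K ~ n^{1/3} equal-width bins.
K = int(np.ceil(n ** (1 / 3)))
bins = np.minimum((x * K).astype(int), K - 1)

sigma2, tau2 = 0.1**2, 1.0          # noise variance, prior variance (assumed known)
counts = np.bincount(bins, minlength=K)
sums = np.bincount(bins, weights=y, minlength=K)

# Conjugate normal posterior for each step height (independent across bins).
post_var = 1.0 / (counts / sigma2 + 1.0 / tau2)
post_mean = post_var * sums / sigma2

# Draw unconstrained posterior samples, then project onto the monotone cone.
iso = IsotonicRegression(increasing=True)
grid = (np.arange(K) + 0.5) / K      # bin midpoints
draws = []
for _ in range(200):
    theta = post_mean + np.sqrt(post_var) * rng.standard_normal(K)
    draws.append(iso.fit_transform(grid, theta))   # isotonic (L2) projection
draws = np.array(draws)

# Pointwise posterior-mean estimate of the monotone regression function.
f_hat = draws.mean(axis=0)
print(np.round(f_hat, 3))
```

The projection here is an unweighted L2 isotonic fit over bin midpoints; the projection map and metric used in the paper may differ in detail.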


Similar articles

Convergence Rates in Nonparametric Bayesian Density Estimation

We consider Bayesian density estimation for compactly supported densities using Bernstein mixtures of beta-densities equipped with a Dirichlet prior on the distribution function. We derive the rate of convergence for α-smooth densities for 0 < α ≤ 2 and show that a faster rate of convergence can be obtained by using fewer terms in the mixtures than proposed before. The Bayesian procedure adapts...
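As a concrete illustration of the Bernstein-polynomial prior described in this abstract, the sketch below evaluates one random density drawn as a mixture of Beta(j, k − j + 1) components with Dirichlet weights. The fixed number of components k and the symmetric Dirichlet parameter are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a random Bernstein-polynomial density on [0, 1]:
# a mixture of Beta(j, k - j + 1) densities, j = 1..k, with weights drawn
# from a Dirichlet distribution. k and the symmetric Dirichlet parameter
# below are illustrative choices, not taken from the paper.
import numpy as np
from scipy.stats import beta, dirichlet

rng = np.random.default_rng(1)

k = 10                                                # number of mixture components
w = dirichlet.rvs(np.ones(k), random_state=rng)[0]    # random Dirichlet weights

def bernstein_density(x, w):
    """Evaluate sum_j w_j * Beta(x; j, k - j + 1) at points x."""
    k = len(w)
    comps = np.array([beta.pdf(x, j, k - j + 1) for j in range(1, k + 1)])
    return w @ comps

xs = np.linspace(0.0, 1.0, 5)
print(np.round(bernstein_density(xs, w), 3))
```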


Convergence rates in monotone separable stochastic networks

We study bounds on the rate of convergence to the stationary distribution in monotone separable networks which are represented in terms of stochastic recursive sequences. Monotonicity properties of this subclass of Markov chains allow us to formulate conditions in terms of marginal network characteristics. Two particular examples, generalized Jackson networks and multiserver queues, are conside...


Methodology and Convergence Rates for Functional Linear Regression

In functional linear regression, the slope "parameter" is a function. Therefore, in a nonparametric context, it is determined by an infinite number of unknowns. Its estimation involves solving an ill-posed problem and has points of contact with a range of methodologies, including statistical smoothing and deconvolution. The standard approach to estimating the slope function is based explicitly o...


ADMM for monotone operators: convergence analysis and rates

We propose in this paper a unifying scheme for several algorithms from the literature dedicated to solving monotone inclusion problems involving compositions with linear continuous operators in infinite-dimensional Hilbert spaces. We show that a number of primal-dual algorithms for monotone inclusions, and also the classical ADMM numerical scheme for convex optimization problems, along wit...


Convergence rates and asymptotic standard errors for Markov chain Monte Carlo algorithms for Bayesian probit regression

Consider a probit regression problem in which Y_1, ..., Y_n are independent Bernoulli random variables such that Pr(Y_i = 1) = Φ(x_i^T β), where x_i is a p-dimensional vector of known covariates associated with Y_i, β is a p-dimensional vector of unknown regression coefficients, and Φ(·) denotes the standard normal distribution function. We study Markov chain Monte Carlo algorithms for explorin...
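A standard MCMC algorithm for this model is the Albert–Chib data-augmentation Gibbs sampler, sketched below under a flat prior on β. The flat prior and the simulated data are illustrative assumptions; the exact algorithms and priors studied in the paper may differ.

```python
# Sketch of the Albert-Chib data-augmentation Gibbs sampler for Bayesian
# probit regression, shown with a flat prior on beta (an illustrative
# assumption). Each iteration draws latent truncated-normal variables z
# given beta, then beta given z.
import numpy as np
from scipy.stats import norm, truncnorm

rng = np.random.default_rng(2)

# Simulated design and response: Pr(Y_i = 1) = Phi(x_i^T beta_true).
n, p = 200, 3
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -0.5, 0.25])
y = (rng.uniform(size=n) < norm.cdf(X @ beta_true)).astype(float)

XtX_inv = np.linalg.inv(X.T @ X)
chol = np.linalg.cholesky(XtX_inv)
beta = np.zeros(p)

for it in range(1000):
    # z_i | beta, y_i ~ N(x_i^T beta, 1), truncated to (0, inf) if y_i = 1
    # and to (-inf, 0) if y_i = 0.
    mu = X @ beta
    lower = np.where(y == 1, -mu, -np.inf)
    upper = np.where(y == 1, np.inf, -mu)
    z = mu + truncnorm.rvs(lower, upper, size=n, random_state=rng)

    # beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1}) under the flat prior.
    beta = XtX_inv @ (X.T @ z) + chol @ rng.standard_normal(p)

print(np.round(beta, 3))
```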



Journal

Journal title: Electronic Journal of Statistics

Year: 2021

ISSN: 1935-7524

DOI: https://doi.org/10.1214/21-ejs1861